Compute Accuracy Values

This algorithm computes accuracy assessment values and stores them in variables.

This algorithm is related to the algorithm Export Confusion Matrix and to Tools > Accuracy Assessment, which calculate and visualize a confusion matrix. The workflow is described in the User Guide under Developer > Tools > Tools.

See also the video Accuracy Assessment and the tutorial Sample Statistics and Accuracy Assessment Tool.

Supported Domains

Image object level

Algorithm Parameters

Input data

Select whether an image object level (defined in the domain) or an existing, previously exported confusion matrix is used as input for the calculation. Choose between Image object level and Exported confusion matrix.

Class filter

Select the ground truth classes. (Only available for Input data > Image object level.)

Ground truth

Select the source of the ground truth information, that is, where the ground truth information is stored (for example, in an object variable). (Only available for Input data > Image object level.)

Object variable

Select the object variable where the ground truth information is stored. (Only available for Input data > Image object level and Ground truth > Object variable.)

Input path

Select the path of an existing confusion matrix to be used as input for the computation. (Only available for Input data > Exported confusion matrix.)

Cohen's Kappa

Define a variable name to store Cohen's Kappa coefficient. The Kappa coefficient measures the agreement between two sets (a classification result and ground truth objects/samples) while correcting for agreement that occurs by chance. This statistic is useful for measuring the predictive accuracy of a classification summarized in a confusion matrix. The Kappa coefficient ranges over [-1, 1]: a value of 0 is obtained when the observed agreement equals chance agreement, and the upper limit of +1 occurs only when there is perfect agreement between the two sets.
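
To illustrate the definition above, the following Python sketch computes Cohen's Kappa from a confusion matrix of counts. It is not the eCognition implementation; the function name and the example matrix values are made up for illustration, with rows taken as classification results and columns as ground truth.

```python
import numpy as np

def cohens_kappa(confusion) -> float:
    """Cohen's Kappa from a square confusion matrix of counts."""
    confusion = np.asarray(confusion, dtype=float)
    total = confusion.sum()
    # Observed agreement: proportion of objects on the diagonal.
    p_observed = np.trace(confusion) / total
    # Chance agreement: expected proportion if rows and columns were independent.
    p_expected = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / total ** 2
    return (p_observed - p_expected) / (1.0 - p_expected)

# Illustrative 3-class confusion matrix (counts are made up).
matrix = np.array([[50,  2,  3],
                   [ 4, 40,  6],
                   [ 1,  5, 45]])
print(round(cohens_kappa(matrix), 3))  # ~0.80: strong agreement beyond chance
```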

Overall accuracy

Define a variable name to store the overall accuracy. The overall accuracy is the proportion of reference/ground truth sites that were mapped correctly. It is usually expressed as a percentage, where 100% corresponds to a perfect classification in which all reference objects were classified correctly; the stored value lies in the range [0, 1].
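
For comparison, a minimal sketch of the overall accuracy computed from the same kind of confusion matrix (again not the eCognition implementation; the helper name and counts are illustrative only): it is simply the sum of the diagonal divided by the total number of reference objects.

```python
import numpy as np

def overall_accuracy(confusion) -> float:
    """Proportion of reference objects mapped correctly: diagonal sum / total."""
    confusion = np.asarray(confusion, dtype=float)
    return np.trace(confusion) / confusion.sum()

matrix = np.array([[50,  2,  3],
                   [ 4, 40,  6],
                   [ 1,  5, 45]])
print(f"{overall_accuracy(matrix):.3f}")  # stored as a value in [0, 1]
print(f"{overall_accuracy(matrix):.1%}")  # commonly reported as a percentage
```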